2 research outputs found

    Data-driven body–machine interface for the accurate control of drones

    The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator's and the robot's actions that are determined by the design and characteristics of the interface and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this method, we developed an intuitive, gesture-based control interface for real and simulated drones, which outperformed a standard joystick in terms of learning time and steering ability. Applying this procedure to identify body–machine patterns for specific applications could support the development of more intuitive and effective interfaces.
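
    To make the kind of mapping described above concrete, the minimal sketch below converts torso lean angles (for example, pitch and roll from an inertial sensor worn on the trunk) into forward-speed and turn-rate commands for a drone. The gains, deadzone, and choice of controlled variables are illustrative assumptions, not the calibrated mapping or sensor placement reported in the paper.

    import numpy as np

    # Illustrative gains and deadzone; the study's calibrated values are not given here.
    GAIN_PITCH = 1.5   # m/s of forward speed per radian of torso pitch
    GAIN_ROLL = 1.0    # rad/s of turn rate per radian of torso roll
    DEADZONE = 0.05    # radians of lean ignored around the neutral posture

    def torso_to_command(pitch, roll, neutral_pitch=0.0, neutral_roll=0.0):
        """Map torso lean angles (radians) to drone velocity commands.

        Leaning forward/backward sets forward speed; leaning left/right sets
        the turn rate. A small deadzone around the neutral posture rejects
        ordinary postural sway.
        """
        dp = pitch - neutral_pitch
        dr = roll - neutral_roll
        dp = 0.0 if abs(dp) < DEADZONE else dp
        dr = 0.0 if abs(dr) < DEADZONE else dr
        return GAIN_PITCH * dp, GAIN_ROLL * dr

    if __name__ == "__main__":
        # Example: the operator leans 10 degrees forward and 3 degrees to the right.
        fwd, turn = torso_to_command(np.deg2rad(10), np.deg2rad(3))
        print(f"forward speed: {fwd:.2f} m/s, turn rate: {turn:.2f} rad/s")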

    Body-machine interfaces for non-homologous human-machine interactions

    Virtual reality (VR), the interactive experience of being immersed in a simulated environment, has seen tremendous development in recent years. Numerous applications have emerged, ranging from flight simulators and virtual ascents of Mount Everest to surgery simulators and scenarios for treating acrophobia. These applications serve different purposes and are not designed for the same populations, and to optimize the VR experience these aspects must be taken into account during the development of virtual environments. The intensity of a virtual experience depends on three main factors: the quality and rendering of the virtual environment, the interaction opportunities, and aspects inherent to the user, such as physical abilities or previous VR experience. This thesis addresses the latter two aspects. Part I describes the development of a body-machine interface for the immersive steering of simulated or real drones. Chapter 2 presents a systematic analysis of the spontaneous gestural strategies selected by untrained participants asked to interact with a drone, showing the existence of patterns common to the considered population. In chapter 3, I use these patterns to define gestural control strategies to pilot a drone; in particular, I demonstrate that a set of torso movements leads to better steering performance and faster learning than the joystick commonly used for this kind of task. Part II focuses on the multisensory integration required to interact with such a system and its development during childhood. In chapter 4, I evaluate the steering abilities of 6- to 10-year-old children on the flight simulator developed earlier. These experiments reveal that the selection of the appropriate head-trunk coordination strategy is immature in children until the age of 8, even though these strategies are already part of their postural repertoire. Finally, in chapter 5, a virtual archery game highlights the development of visuomotor integration during childhood. These studies emphasize the benefits of user-driven interaction interfaces over pre-existing devices and reveal age-related interaction differences that should be considered when designing virtual environments.
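
    As a rough illustration of how common movement patterns can be extracted from such recordings, the sketch below pools body-motion data from several participants and uses a singular value decomposition (a simple form of principal component analysis) to recover the dominant coordinated movements. The thesis's actual analysis pipeline, marker set, and preprocessing are not specified here; the array shapes and the use of PCA are assumptions made only for illustration.

    import numpy as np

    def common_motion_patterns(trials, n_components=3):
        """Extract shared movement patterns from pooled body-motion recordings.

        `trials` is a list of arrays of shape (n_samples, n_markers), one per
        participant, recorded with the same marker set. The recordings are
        stacked, mean-centered, and decomposed with an SVD; the leading right
        singular vectors are the dominant coordinated movement patterns, and
        the returned ratios give the variance each pattern explains.
        """
        data = np.vstack(trials)
        data = data - data.mean(axis=0)
        _, s, vt = np.linalg.svd(data, full_matrices=False)
        explained = s**2 / np.sum(s**2)
        return vt[:n_components], explained[:n_components]

    if __name__ == "__main__":
        # Synthetic example: 5 participants, 200 samples each, 12 body markers.
        rng = np.random.default_rng(0)
        trials = [rng.standard_normal((200, 12)) for _ in range(5)]
        patterns, explained = common_motion_patterns(trials)
        print("variance explained by the top patterns:", np.round(explained, 3))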